Sipser–Lautemann theorem

In computational complexity theory, the Sipser–Lautemann theorem or Sipser–Gács–Lautemann theorem states that bounded-error probabilistic polynomial (BPP) time is contained in the polynomial time hierarchy, and more specifically in Σ2 ∩ Π2.

In 1983, Michael Sipser showed that BPP is contained in the polynomial time hierarchy. Péter Gács showed that BPP is actually contained in Σ2 ∩ Π2. Clemens Lautemann contributed by giving a simple proof of BPP’s membership in Σ2 ∩ Π2, also in 1983. It is conjectured that in fact BPP=P, which is a much stronger statement than the Sipser–Lautemann theorem.

Proof

Michael Sipser's version of the proof works as follows. Without loss of generality, a machine M ∈ BPP with error ≤ 2^{−|x|} can be chosen. (All BPP problems can be amplified to reduce the error probability exponentially.) The basic idea of the proof is to define a Σ2 ∩ Π2 sentence that is equivalent to stating that x is in the language, L, defined by M, by using a set of transforms of the random-string inputs.
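
For illustration, the amplification step is just independent repetition with a majority vote. Below is a minimal Python sketch, where base_machine(x, r) is a hypothetical stand-in for the underlying BPP machine:

import random

def amplify(base_machine, x, num_random_bits, reps):
    """Majority vote over `reps` independent runs of a BPP machine.

    `base_machine(x, r)` stands in for M: it takes the input x and an
    integer r encoding the random string, and answers correctly with
    probability at least 2/3.  By a Chernoff bound the majority is wrong
    with probability at most 2^(-c*reps) for some constant c > 0, so
    reps = Theta(|x|) pushes the error below 2^(-|x|).
    """
    votes = sum(base_machine(x, random.getrandbits(num_random_bits))
                for _ in range(reps))
    return 2 * votes > reps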

Since the output of M depends on its random input as well as the input x, it is useful to define the set of random strings that produce the correct output: A(x) = {r | M(x,r) accepts}. The key to the proof is to note that when x ∈ L, A(x) is very large, and when x ∉ L, A(x) is very small. By using bitwise parity, ⊕, a set of transforms can be defined as A(x) ⊕ t = {r ⊕ t | r ∈ A(x)}. The first main lemma of the proof shows that the union of a small finite number of these transforms will contain the entire space of random input strings. Using this fact, a Σ2 sentence and a Π2 sentence can be generated that are true if and only if x ∈ L (see corollary).
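
For illustration, encoding random strings as integers makes these transforms one line of code; the following Python sketch uses accepting as a hypothetical stand-in for A(x):

def translate(accepting, t):
    """The transform A(x) ⊕ t = {r ⊕ t | r ∈ A(x)}: bitwise parity of
    bit strings is integer XOR once strings are encoded as integers."""
    return {r ^ t for r in accepting}

translate({0b0011, 0b0101}, 0b0001)    # {0b0010, 0b0100}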

Lemma 1

The general idea of Lemma 1 is to prove that if A(x) covers a large part of the random space R = \{0,1\}^{|r|}, then there exists a small set of translations that will cover the entire random space. In more mathematical language:

If \frac{|A(x)|}{|R|} > 1 - \frac{1}{2^{|x|}}, then \exists t_1,t_2,\ldots,t_{|r|}, where t_i \in \{0,1\}^{|r|}, such that \bigcup_i A(x) \oplus t_i = R.

Proof. Randomly pick t_1, t_2, \ldots, t_{|r|}. Let S = \bigcup_i A(x) \oplus t_i (the union of all transforms of A(x)).

The t_i are independent and uniform, and r \notin A(x) \oplus t_i exactly when r \oplus t_i \notin A(x), so each factor below equals 1 - |A(x)|/|R| < 2^{-|x|}. Hence, for any fixed r in R,

\Pr [r \notin S] = \Pr [r \notin A(x) \oplus t_1] \cdot \Pr [r \notin A(x) \oplus t_2] \cdots \Pr [r \notin A(x) \oplus t_{|r|}] \le { \frac{1}{2^{|x| \cdot |r|}} }.

By the union bound, the probability that at least one element of R is not in S is

\Pr \Bigl[ \exists r \in R : r \notin S \Bigr] \le \sum_{r \in R} \frac{1}{2^{|x| \cdot |r|}} = \frac{2^{|r|}}{2^{|x| \cdot |r|}} < 1.

Therefore

\Pr [S = R] \ge 1 - \frac{2^{|r|}}{2^{|x| \cdot |r|}} > 0.

Thus there is some choice of t_1,t_2,\ldots,t_{|r|} such that

 \bigcup_i A(x) \oplus t_i = R.
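
For illustration, the probabilistic argument can be checked empirically on a small space. The following Python sketch assumes A(x) misses a single string of {0,1}^n, so the hypothesis of Lemma 1 holds whenever |x| < n:

import random

def random_translates_cover(accepting, n, trials=100):
    """Empirical check of Lemma 1: when A(x) fills almost all of
    {0,1}^n, n uniformly random translates almost always cover R."""
    R = set(range(2 ** n))
    hits = 0
    for _ in range(trials):
        ts = [random.randrange(2 ** n) for _ in range(n)]
        union = set()
        for t in ts:
            union |= {r ^ t for r in accepting}
        hits += (union == R)
    return hits / trials

print(random_translates_cover(set(range(2 ** 8)) - {0}, 8))   # typically 1.0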

Lemma 2

The previous lemma shows that A(x) can cover every possible point in the space using a small set of translations. Complementary to this, for x ∉ L only a small fraction of the space is covered by A(x), so no small set of translation vectors t_i can make the transforms of A(x) cover the entire random space:

If \frac{|A(x)|}{|R|} \le \frac{1}{2^{|x|}}, then there are no vectors t_1,t_2,\dots,t_{|r|} such that \bigcup_i A(x) \oplus t_i = R.

Proof. Each transform A(x) \oplus t_i has the same size as A(x), so

\Bigl| \bigcup_i A(x) \oplus t_i \Bigr| \le |r| \cdot |A(x)| \le \frac{|r|}{2^{|x|}} |R| < |R|,

since |r| is polynomial in |x| and hence smaller than 2^{|x|}.
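
For illustration, the counting bound can be verified exhaustively on a toy space (Python, with n playing the role of |r|):

from itertools import product

n = 4                                   # toy |r|, so R = {0,1}^4 as integers
R = set(range(2 ** n))
A = {0}                                 # |A|/|R| = 1/16: a "small" A(x)

# Transforms never change cardinality: |A ⊕ t| = |A| for every t.
assert all(len({r ^ t for r in A}) == len(A) for t in R)

# The most that any n transforms can cover is n·|A| = 4 < 16 = |R|.
best = max(len(set().union(*({r ^ t for r in A} for t in ts)))
           for ts in product(R, repeat=n))
print(best, len(R))                     # 4 16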

Corollary

An important corollary of the lemmas is that the membership condition x ∈ L can be expressed as a Σ2 expression, as follows.

x \in L \iff \exists t_1,t_2,\dots,t_{|r|}\, \forall r \in R \bigvee_{1 \le i \le |r|} (M(x, r \oplus t_i) \text{ accepts}).

That is, x is in language L if and only if there exist |r| binary vectors t_1,\dots,t_{|r|} such that, for every random bit vector r, the machine M accepts x with at least one of the random strings r ⊕ t_i.

The above expression is in Σ2 in that it is first existentially then universally quantified. Therefore BPP ⊆ Σ2. Because BPP is closed under complement, it follows that BPP ⊆ Σ2 ∩ Π2.
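
For illustration, the quantifier structure can be evaluated by brute force on a toy machine. In the Python sketch below, M(x, r) is a hypothetical stand-in for the amplified machine; the point of the theorem is the quantifier shape (∃ then ∀), not the running time of this exponential check:

from itertools import product

def sigma2_accepts(M, x, n):
    """Evaluate ∃ t_1..t_n ∀ r ∈ {0,1}^n ∨_i M(x, r ⊕ t_i) by
    direct enumeration over integers encoding n-bit strings."""
    R = range(2 ** n)
    return any(all(any(M(x, r ^ t) for t in ts) for r in R)
               for ts in product(R, repeat=n))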

Lautemann's proof

Here we present the proof, due to Lautemann, that BPP ⊆ Σ2. See Trevisan's notes for more information.

Lemma 3

Based on the definition of BPP we show the following:

If L is in BPP then there is an algorithm A such that for every x,

{\rm Pr}_r(A(x,r) = \mbox{right answer}) \ge 1 - \frac{1}{3m},

where m = |r| = |x|^{O(1)} is the number of random bits and A runs in time |x|^{O(1)}.

Proof: Let A' be a BPP algorithm for L. For every x, \Pr_r(A'(x,r) = \mbox{wrong answer}) \le 1/3. A' uses m'(n) random bits, where n = |x|. Do k(n) repetitions of A' and accept if and only if at least k(n)/2 executions of A' accept. Call this new algorithm A. Then A uses k(n)m'(n) random bits and, by a Chernoff bound, for some constant c > 0,

{\rm Pr}_r(A(x,r) = \mbox{wrong answer}) \le 2^{-ck(n)}.

We can then find k(n) with k(n)=\Theta (\log m'(n)) such that

\frac{1}{2^{ck(n)}} \le \frac{1}{3k(n)m'(n)}.
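
For illustration, the smallest such k(n) can be found numerically; the Python sketch below assumes c = 1 for concreteness:

def min_repetitions(m_prime, c=1.0):
    """Smallest k with 2^(-c*k) <= 1/(3*k*m'): the number of
    repetitions Lemma 3 needs, which grows like Theta(log m')."""
    k = 1
    while 2.0 ** (-c * k) > 1.0 / (3 * k * m_prime):
        k += 1
    return k

for m_prime in (10, 100, 1000, 10 ** 6):
    print(m_prime, min_repetitions(m_prime))   # k grows logarithmically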

Theorem 1

Proof: Let L be in BPP and let A be as in Lemma 3. We want to show that

x \in L \iff \exists y_1,\dots,y_m \in \{0,1\}^m\, \forall z \in \{0,1\}^m \bigvee_{i=1}^m A(x,y_i \oplus z)=1,

where m is the number of random bits used by A on input x. First, suppose x \in L. Then

\begin{align}
  {\rm Pr}_{y_1,\dots,y_m}(\exists z& A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0)\\
      &\le \sum_{z \in \{0,1\}^m} {\rm Pr}_{y_1,\dots,y_m}(A(x,y_1 \oplus z) = \dots = A( x, y_m \oplus z) = 0)\\
      &\le 2^m \frac{1}{(3m)^m}\\
      &< 1.
\end{align}

Thus

{\rm Pr}_{y_1,\dots,y_m}\Bigl( \forall z \bigvee_i A(x,y_i \oplus z)\Bigr) = 1 - {\rm Pr}_{y_1,\dots,y_m}\bigl(\exists z\, A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0\bigr) > 0.

Thus a suitable (y_1,\dots,y_m) exists.

Conversely, suppose x \notin L. Then, for any fixed y_1,\dots,y_m,

{\rm Pr}_z \Bigl( \bigvee_i A(x,y_i \oplus z) \Bigr) \le \sum_i {\rm Pr}_z (A(x,y_i \oplus z)=1)\le m \frac{1}{3m}= \frac{1}{3}.

Thus

{\rm Pr}_z(A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0)= 1 - {\rm Pr}_z \Bigl( \bigvee_i A(x,y_i \oplus z) \Bigr)\ge \frac{2}{3} > 0.

Thus, for every y_1,\dots,y_m \in \{0,1\}^m, there is a z such that \bigvee_i A(x,y_i \oplus z)=0.
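
For illustration, the equivalence can be sanity-checked end to end on a toy algorithm with zero error (which trivially meets the 1/(3m) bound of Lemma 3). A Python sketch with m = 3 random bits:

from itertools import product

def sigma2_holds(A, x, m):
    """Evaluate ∃ y_1..y_m ∀ z ∈ {0,1}^m ∨_i A(x, y_i ⊕ z) by enumeration."""
    R = range(2 ** m)
    return any(all(any(A(x, y ^ z) for y in ys) for z in R)
               for ys in product(R, repeat=m))

def A(x, r):
    # Toy algorithm: accepts iff x is odd, regardless of r (error 0 <= 1/(3m)).
    return (x & 1) == 1

for x in range(8):
    assert sigma2_holds(A, x, 3) == ((x & 1) == 1)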

Stronger version

The theorem can be strengthened to \mathsf{BPP} \subseteq \mathsf{MA} \subseteq \mathsf{S}_2^P \subseteq \Sigma_2 \cap \Pi_2 (see MA and S2P).
